Precise calculation of atmospheric absorption in the microwave band is essential for atmospheric remote sensing with ground-based and satellite-borne radiometers, as it is a key element of retrieval procedures for temperature, humidity, and trace-gas concentration. The accuracy of the absorption model directly affects the accuracy of the retrieved information and the reliability of the resulting forecasts. In this study, we analyze the difference between observed and simulated microwave spectra obtained from more than four years of microwave and radiosonde observations over Nizhny Novgorod (56.2° N, 44° E). We focus on zenith-measured microwave data in the 20–60 GHz frequency range under clear-sky conditions. Using a conventional absorption model in the simulations leads to a significant discrepancy in frequency channels within the 51–54 GHz range, whereas calculations employing a more accurate model based on the Energy-Corrected Sudden (ECS) formalism for molecular oxygen absorption reduce the difference several-fold.